What does touch tell us about emotions in touchscreen-based gameplay?
This is the post-print version of the article. The official published version can be accessed from the link below. Copyright © 2012 ACM. It is posted here by permission of ACM for your personal use; not for redistribution.

Nowadays, more and more people play games on touch-screen mobile phones. This raises an interesting question: does touch behaviour reflect the player's emotional state? If so, it would be valuable not only as an evaluation indicator for game designers, but also for real-time personalisation of the game experience. Psychology studies on acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. Based on touch behaviour, machine learning algorithms were used to build systems that automatically discriminate between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal, and two levels of valence. The systems reached between 69% and 77% correct discrimination between the four emotional states, and higher accuracy (~89%) when discriminating between two levels of arousal and between two levels of valence.
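The discrimination task described above can be sketched as a standard supervised-classification pipeline. The sketch below is illustrative only: the feature names and the synthetic data are assumptions, not the paper's actual finger-stroke features or dataset, and the classifier choice is a generic stand-in for the machine learning algorithms the paper evaluates.

```python
# Hypothetical sketch of four-class emotion discrimination from stroke
# features. The three feature columns (pressure, length, speed) and the
# Gaussian synthetic data are illustrative assumptions, not the paper's data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
STATES = ["Excited", "Relaxed", "Frustrated", "Bored"]

# 40 synthetic samples per state; each row = [pressure, length, speed].
# Class means are deliberately separated so the toy task is learnable.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 3)) for i in range(4)])
y = np.repeat(np.arange(len(STATES)), 40)

# RBF-kernel SVM evaluated with 5-fold cross-validation, analogous to
# reporting a percentage of correct discrimination between the four states.
clf = SVC(kernel="rbf")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Collapsing the four states into two arousal levels (e.g. Excited/Frustrated vs Relaxed/Bored) would turn the same pipeline into the binary tasks the abstract reports higher accuracy for.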
Touching virtual agents: embodiment and mind
In this paper we outline the design and development of an embodied conversational agent setup that incorporates an augmented-reality screen and a tactile sleeve. With this setup the agent can visually and physically touch the user. We provide a literature overview of embodied conversational agents and of haptic technologies, and argue for the importance of adding touch to an embodied conversational agent. Finally, we provide guidelines for studies involving the touching virtual agent (TVA) setup.